321 research outputs found

    Geodatabase development and GIS based analysis for resource assessment of placer platinum in the offshore region of Goodnews Bay, Alaska

    Thesis (M.S.) University of Alaska Fairbanks, 2006. Goodnews Bay, southwest Alaska, is known for extensive Pt reserves that have their source in the neighboring Red Mountain. The reserves potentially extend offshore into the Bering Sea. This study aims to develop a geodatabase that integrates all offshore platinum-related data collected by researchers and agencies in the past, with the intent of identifying data gaps. Based on these data gaps, 49 new areas were sampled for Pt and geophysical data were collected in summer 2005. A spatial distribution map for offshore Pt was created using a new Multiple Regression Pattern Recognition Technique (MRPRT) that gave an R² = 0.76, a significant improvement over standard GIS-based geospatial techniques. Four potential Pt exploration areas were delineated, including one area where drowned ultramafics and buried alluvial channels co-occur. Coastal currents influenced the surficial platinum accumulations, and no clear relation between Pt distribution and sand bars in the far offshore could be established.
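
    The abstract reports only that a multiple-regression-based technique (MRPRT) achieved R² = 0.76 for the offshore Pt distribution; the technique itself is not described here. As a point of reference, the sketch below shows a plain multiple-regression fit and R² computation, with entirely hypothetical predictor columns and sample values; it is not the thesis's MRPRT.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

# Hypothetical sample table: each row is one offshore sample site.
# Columns are illustrative predictors, not the thesis's actual variables:
# water depth (m), distance to the onshore source (km), median grain size (mm).
X = np.array([
    [12.0, 3.1, 0.25],
    [18.5, 5.4, 0.40],
    [ 7.2, 1.8, 0.18],
    [22.0, 7.9, 0.55],
    [15.3, 4.2, 0.31],
])
y = np.array([95.0, 40.0, 160.0, 22.0, 70.0])  # Pt concentration (ppb), illustrative

model = LinearRegression().fit(X, y)   # ordinary least-squares multiple regression
r2 = r2_score(y, model.predict(X))     # coefficient of determination on the fit
print(f"R^2 = {r2:.2f}", model.coef_, model.intercept_)
```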

    Ultimate Order Statistics-Based Prototype Reduction Schemes

    The objective of Prototype Reduction Schemes (PRSs) and Border Identification (BI) algorithms is to reduce the number of training vectors, while simultaneously attempting to guarantee that the classifier built on the reduced design set performs as well, or nearly as well, as the classifier built on the original design set. In this paper, we push the limit of the field of PRSs to see whether we can obtain a classification accuracy comparable to the optimal by condensing the information in the data set into a single training point. We demonstrate that such PRSs exist and are attainable, and show that the design and implementation of such schemes work with the recently introduced paradigm of Order Statistics (OS)-based classifiers. These classifiers, referred to as Classification by Moments of Order Statistics (CMOS), are essentially anti-Bayesian in their modus operandi. In this paper, we demonstrate the power and potential of CMOS to yield single-element PRSs which are either “selective” or “creative”, where in each case we resort to a non-parametric or a parametric paradigm, respectively. We also report a single-feature single-element creative PRS. All of these solutions have been used to achieve classification for real-life data sets from the UCI Machine Learning Repository, where we have followed an approach that is similar to the Naïve-Bayes’ (NB) strategy although it is essentially of an anti-Naïve-Bayes’ paradigm. The remarkable facet of this approach is that the training set can be reduced to a single pattern from each of the classes, which is, in turn, determined by the CMOS features. It is even more striking that the scheme can be rendered operational by using the information in a single feature of such a single data point. In each of these cases, the accuracy of the proposed PRS-based approach is very close to the optimal Bayes’ bound and is almost comparable to that of the SVM.
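
    As a rough illustration of the anti-Bayesian idea described above, the sketch below reduces each class to a single prototype taken from an order statistic (an empirical quantile shifted toward the opposing class) rather than the mean, and classifies by nearest prototype on a single feature. The quantile choice and the data are illustrative assumptions, not the paper's exact CMOS rule.

```python
import numpy as np

def cmos_like_prototypes(x_class1, x_class2, q=2/3):
    """Single-feature, single-prototype reduction in the spirit of CMOS.

    Each class is represented by one empirical quantile shifted toward the
    opposing class instead of by its mean; q = 2/3 is an illustrative choice,
    not the paper's exact order-statistic rule.
    """
    # In this toy setup class 1 lies below class 2, so class 1 takes an upper
    # quantile and class 2 a lower one.
    p1 = np.quantile(x_class1, q)
    p2 = np.quantile(x_class2, 1 - q)
    return p1, p2

def classify(x, p1, p2):
    # Nearest-prototype ("selective") decision on a single feature.
    return 1 if abs(x - p1) <= abs(x - p2) else 2

rng = np.random.default_rng(0)
c1 = rng.normal(0.0, 1.0, 200)   # synthetic class 1 training samples
c2 = rng.normal(3.0, 1.0, 200)   # synthetic class 2 training samples
p1, p2 = cmos_like_prototypes(c1, c2)
print("prototypes:", p1, p2, "-> label of x = 1.4:", classify(1.4, p1, p2))
```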

    A Practical Approach for Implementing the Probability of Liquefaction in Performance Based Design

    Empirical Liquefaction Models (ELMs) are the usual approach for predicting the occurrence of soil liquefaction. These ELMs are typically based on in situ index tests, such as the Standard Penetration Test (SPT) and Cone Penetration Test (CPT), and are broadly classified as deterministic and probabilistic models. A deterministic model provides a “yes/no” response to the question of whether or not a site will liquefy. However, Performance-Based Earthquake Engineering (PBEE) requires an estimate of the probability of liquefaction (PL), which is a quantitative and continuous measure of the severity of liquefaction. Probabilistic models are better suited for PBEE but are still not consistently used in routine engineering applications, primarily due to the limited guidance regarding which model to use and the difficulty in interpreting the resulting probabilities. The practical implementation of a probabilistic model requires a threshold of liquefaction (THL). Researchers who have used probabilistic methods have either proposed subjective THLs or used the established deterministic curves to develop the THL. In this study, we compare the predictive performance of the various deterministic and probabilistic ELMs within a quantitative validation framework. We incorporate estimated costs associated with risk as well as with risk mitigation to interpret PL using precision and recall, and to compute the optimal THL using the Precision-Recall (P-R) cost curve. We also provide the P-R cost curves for the popular probabilistic models developed using Bayesian updating for SPT and CPT data by Cetin et al. (2004) and Moss et al. (2006), respectively. These curves should be immediately useful to a geotechnical engineer who needs to choose the optimal THL that incorporates the costs associated with the risk of liquefaction and the costs associated with mitigation.
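
    The following sketch illustrates, under assumed placeholder costs and data, how a cost-weighted threshold of liquefaction (THL) could be selected by sweeping candidate thresholds and trading off missed liquefaction events against unnecessary mitigation. It is a conceptual stand-in, not the P-R cost-curve procedure of the paper or the Cetin et al. (2004) / Moss et al. (2006) models.

```python
import numpy as np

def optimal_threshold(p_liq, liquefied, cost_fn, cost_fp):
    """Sweep candidate thresholds of liquefaction (THL) and return the one
    that minimises total expected cost.

    p_liq     : predicted probability of liquefaction per site
    liquefied : 1 if the site actually liquefied, else 0
    cost_fn   : assumed cost of a missed liquefaction (risk cost)
    cost_fp   : assumed cost of unnecessary mitigation
    """
    best_thl, best_cost = None, np.inf
    for thl in np.linspace(0.05, 0.95, 91):
        predicted = p_liq >= thl
        fn = np.sum(~predicted & (liquefied == 1))   # missed liquefaction events
        fp = np.sum(predicted & (liquefied == 0))    # false alarms (needless mitigation)
        cost = fn * cost_fn + fp * cost_fp
        if cost < best_cost:
            best_thl, best_cost = thl, cost
    return best_thl, best_cost

# Illustrative probabilities from some ELM and the observed outcomes.
p = np.array([0.9, 0.7, 0.55, 0.4, 0.3, 0.2, 0.1, 0.85, 0.6, 0.15])
obs = np.array([1, 1, 1, 0, 1, 0, 0, 1, 0, 0])
print(optimal_threshold(p, obs, cost_fn=10.0, cost_fp=1.0))
```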

    Remote sensing for energy resources: Introduction


    Post-eruption deformation processes measured using ALOS-1 and UAVSAR InSAR at Pacaya Volcano, Guatemala

    Pacaya volcano is a persistently active basaltic cone complex located in the Central American Volcanic Arc in Guatemala. In May of 2010, violent Volcanic Explosivity Index-3 (VEI-3) eruptions caused significant topographic changes to the edifice, including a linear collapse feature 600 m long originating from the summit, the dispersion of ~20 cm of tephra and ash on the cone, the emplacement of a 5.4 km long lava flow, and ~3 m of co-eruptive movement of the southwest flank. For this study, Interferometric Synthetic Aperture Radar (InSAR) images (interferograms) processed from both spaceborne Advanced Land Observing Satellite-1 (ALOS-1) and aerial Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR) data acquired between 31 May 2010 and 10 April 2014 were used to measure post-eruptive deformation events. Interferograms suggest three distinct deformation processes after the May 2010 eruptions: (1) subsidence of the area involved in the co-eruptive slope movement; (2) localized deformation near the summit; and (3) emplacement and subsequent subsidence of the ~5.4 km lava flow. The detection of several different geophysical signals emphasizes the utility of measuring volcanic deformation using remote sensing techniques with broad spatial coverage. Additionally, the high spatial resolution of UAVSAR has proven to be an excellent complement to satellite data, particularly for constraining motion components. Measuring the rapid initiation and cessation of flank instability, followed by stabilization and subsequent influence on eruptive features, provides a rare glimpse into volcanic slope stability processes. Observing these and other deformation events contributes both to hazard assessment at Pacaya and to the study of the stability of stratovolcanoes.

    Assessment of post-wildfire debris flow occurrence using classifier tree

    Besides the dangers of an actively burning wildfire, a plethora of other hazardous consequences can occur afterwards. Debris flows are among the most hazardous of these, being known to cause fatalities and extensive damage to infrastructure. Although debris flows are not exclusive to fire-affected areas, a wildfire can increase a location’s susceptibility by stripping protective covers such as vegetation and introducing destabilizing factors such as ash filling soil pores, which increases runoff potential. Due to the associated dangers, researchers are developing statistical models to isolate susceptible locations. Existing models predominantly employ the logistic regression algorithm; however, previous studies have shown that the relationship between the predictors and the response is likely better captured by nonlinear modeling. We therefore propose the use of the nonlinear C5.0 decision tree algorithm, a simple yet robust algorithm that uses inductive inference for categorical data modelling. It employs a tree-like decision-making system that applies conditional statements to split data into homogeneous classes. Our results showed the C5.0 approach to produce stable and higher validation metrics in comparison to logistic regression. A sensitivity of 81% and a specificity of 78% indicate improved predictive capability and give credence to the hypothesis that the data relationships are likely nonlinear.
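
    A minimal sketch of the tree-based classification workflow is given below, using scikit-learn's CART implementation as a stand-in for C5.0 (both build axis-aligned decision trees) and fully synthetic predictors; the feature names, data, and tree settings are assumptions for illustration only.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import confusion_matrix

# Illustrative basin-level predictors: burn-severity fraction, slope (deg),
# and storm intensity (mm/h); these stand in for the study's actual inputs.
rng = np.random.default_rng(1)
X = rng.uniform([0.0, 5.0, 5.0], [1.0, 45.0, 60.0], size=(300, 3))
# Synthetic, nonlinear response: severely burned, steep basins under intense
# rain are more likely to produce a debris flow.
y = (X[:, 0] * X[:, 1] * X[:, 2] > 400).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
# CART (scikit-learn) as a stand-in for C5.0; both build axis-aligned trees.
tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)

tn, fp, fn, tp = confusion_matrix(y_te, tree.predict(X_te)).ravel()
print("sensitivity:", tp / (tp + fn), "specificity:", tn / (tn + fp))
```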

    Optimisation of foliar application of zinc and boron in small cardamom (Elettaria cardamomum Maton)

    A field experiment was conducted at the Indian Cardamom Research Institute, Spices Board, Myladumpara, Idukki district, Kerala during 2006-09 to study the response of foliar application of zinc and boron on growth and yield of small cardamom, and on their contents in index leaves. The experiment was laid out in a randomized block design with twelve treatments replicated thrice. The treatments were various levels of zinc (0.1, 0.25, 0.5, 0.75 and 0.9 %) as zinc sulphate and boron (0.2, 0.4, 0.6, 0.8, 1.0 and 1.2 %) as borax, with a control. The zinc content in the leaves of zinc-treated plants ranged from 53.79 mg kg-1 to 116.67 mg kg-1. The boron content in leaves of the boron-treated plants ranged from 20.62 mg kg-1 to 34.37 mg kg-1. DTPA-extractable zinc in soil was 0.756 to 0.917 mg kg-1 in the zinc treatments and 0.93 mg kg-1 in the control plot. Hot-water-extractable boron in soil ranged from 0.90 mg kg-1 to 2.2 mg kg-1 in the boron treatments and was 0.850 mg kg-1 in the control plot. Application of boron at 0.6 and 0.8 % significantly improved the yield of cardamom compared to the control. A significant quadratic relationship was established between yield and the various levels of zinc, and the quadratic curve gives the optimum zinc dose as 0.38 %. Yield-attributing characters such as the number of panicles per clump and the number of racemes per panicle were positively influenced by the foliar application of zinc and boron.
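
    The optimum-dose calculation follows directly from the fitted quadratic: the vertex of y = c2·x² + c1·x + c0 lies at x = −c1/(2·c2). The sketch below shows this with placeholder dose-yield values, not the trial's actual data.

```python
import numpy as np

# Illustrative (zinc dose %, yield) pairs; placeholder values, not the trial data.
dose = np.array([0.0, 0.10, 0.25, 0.50, 0.75, 0.90])
yield_obs = np.array([210.0, 240.0, 265.0, 268.0, 250.0, 230.0])

# Fit the quadratic response y = c2*x^2 + c1*x + c0 (highest degree first).
c2, c1, c0 = np.polyfit(dose, yield_obs, deg=2)

# The optimum dose is the vertex of the parabola: dy/dx = 2*c2*x + c1 = 0.
optimum = -c1 / (2.0 * c2)
print(f"optimum zinc dose ≈ {optimum:.2f} %")
```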

    Improving automated global detection of volcanic SO2 plumes using the Ozone Monitoring Instrument (OMI)

    Volcanic eruptions pose an ever-present threat to human populations around the globe, but many active volcanoes remain poorly monitored. In regions where ground-based monitoring is present, the effects of volcanic eruptions can be moderated through observational alerts to both local populations and service providers such as air traffic control. However, in regions where volcano monitoring is limited, satellite-based remote sensing provides a global data source that can be utilized to provide near-real-time identification of volcanic activity. This paper details the development of an automated volcanic plume detection method utilizing daily, global observations of sulphur dioxide (SO2) by the Ozone Monitoring Instrument (OMI) on NASA’s Aura satellite. Following identification and classification of known volcanic eruptions in 2005-2009, the OMI SO2 data are analysed using a logistic regression analysis, which permits the identification of volcanic events with an overall accuracy of over 80% and consistent plume identification when the volcanic plume SO2 loading exceeds ~400 tons. The accuracy and minimal user input requirements of the developed procedure provide a basis for the creation of an automated SO2 alert system providing volcanic alerts in regions where ground-based volcano monitoring capabilities are limited. The technique could easily be adapted for use with satellite measurements of volcanic SO2 emissions from other platforms.
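
    A minimal sketch of a logistic-regression plume classifier is given below, using synthetic per-scene features (SO2 mass loading and anomalous-pixel extent) as stand-ins for whatever predictors the study derived from OMI; the feature choices, data, and thresholds are assumptions for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Illustrative per-scene features derived from OMI SO2 retrievals: total SO2
# mass loading (tons) and number of contiguous anomalous pixels. Features,
# labels, and thresholds below are synthetic placeholders.
rng = np.random.default_rng(2)
n = 400
mass = rng.lognormal(mean=5.0, sigma=1.2, size=n)      # tons of SO2
extent = rng.integers(1, 200, size=n)                  # anomalous pixels
volcanic = ((mass > 400) & (extent > 20)).astype(int)  # synthetic "truth"
flip = rng.random(n) < 0.05                            # a little label noise
volcanic = np.where(flip, 1 - volcanic, volcanic)

X = np.column_stack([np.log10(mass), extent])
clf = LogisticRegression(max_iter=1000).fit(X, volcanic)

# Probability that a scene with ~500 t of SO2 over 40 pixels is volcanic.
print(clf.predict_proba([[np.log10(500.0), 40.0]])[0, 1])
```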

    Evaluation of photogrammetry and inclusion of control points: Significance for infrastructure monitoring

    Structure from Motion (SfM)/photogrammetry is a powerful mapping tool for extracting three-dimensional (3D) models from photographs. The method has been applied to a range of applications, including monitoring of infrastructure systems, and could potentially become a substitute for, or at least a complement to, costlier approaches such as laser scanning for infrastructure monitoring. This study expands on previous investigations, which utilized photogrammetric point cloud data to measure the failure mode behavior of a retaining wall model, with an emphasis on more robust spatial testing. In this study, two commonly used photogrammetry software packages were compared to assess the computing performance of the method and the significance of control points in this approach. The impact of control point selection, as part of the photogrammetric modeling process, was also evaluated. Comparisons between the two software tools reveal similar performance in capturing quantitative changes of a retaining wall structure. Results also demonstrate that increasing the number of control points above a certain number does not necessarily increase 3D modeling accuracy, but that, in some cases, their spatial distribution can be more critical. Furthermore, errors in model reproducibility, when compared with total station measurements, were found to be spatially correlated with the arrangement of control points.
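
    Model reproducibility against total station measurements is typically summarised as a check-point error. The sketch below, with placeholder coordinates, shows one simple way such an error could be computed; it is not the study's exact evaluation procedure.

```python
import numpy as np

def checkpoint_rmse(model_xyz, survey_xyz):
    """Root-mean-square 3D error between photogrammetric model coordinates
    and total-station coordinates for the same check points."""
    diffs = model_xyz - survey_xyz
    return np.sqrt(np.mean(np.sum(diffs ** 2, axis=1)))

# Placeholder coordinates (metres) for four check points on the wall model.
model_pts = np.array([[0.002, 1.498, 0.751],
                      [2.001, 1.503, 0.747],
                      [0.005, 0.002, 0.760],
                      [1.995, 0.004, 0.744]])
survey_pts = np.array([[0.000, 1.500, 0.750],
                       [2.000, 1.500, 0.750],
                       [0.000, 0.000, 0.750],
                       [2.000, 0.000, 0.750]])
print(f"check-point RMSE = {checkpoint_rmse(model_pts, survey_pts) * 1000:.1f} mm")
```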

    Monitoring the impact of groundwater pumping on infrastructure using Geographic Information System (GIS) and Persistent Scatterer Interferometry (PSI)

    Transportation infrastructure is critical for the advancement of society, and bridges are vital for an efficient transportation network. Bridges across the world undergo variable deformation/displacement due to the Earth’s dynamic processes. This displacement is caused by ground motion, which arises from many natural and anthropogenic events, including temperature fluctuation, subsidence, landslides, earthquakes, water/sea level variation, and subsurface resource extraction. Continual deformation may cause bridge failure, putting civilians at risk, if not managed properly. Monitoring bridge displacement, large and small, provides evidence of the state and health of the bridge. Traditionally, bridge monitoring has been carried out through on-site surveys. Although this method of bridge monitoring is systematic and successful, it is not the most efficient or cost-effective. Through technological advances, satellite-based Persistent Scatterer Interferometry (PSI) and Geographic Information Systems (GIS) provide a system for analyzing ground deformation over time. This method is applied to distinguish bridges that are more at risk than others by generating models that display the displacement at various locations along each bridge. A bridge’s health and its potential risk can be estimated from analysis of the measured displacement rates. As a result, bridges can be monitored at much faster rates, saving time, money, and resources. PSI data covering Oxnard, California, revealed both bridge displacement and regional ground displacement. Although each bridge maintained a different pattern of displacement, many of the bridges within the Oxnard area displayed an overall downward movement matching regional subsidence trends observed in the area. Patterns in displacement-time series plots provide evidence for two types of deformation mechanisms. Long-term downward movements correlate with the relatively large regional subsidence observed using PSI in Oxnard, while thermal dilation from seasonal temperature changes may cause short-term variability unique to each bridge. Overall, linking geologic, weather, and groundwater patterns with bridge displacement has shown promise for monitoring transportation infrastructure and, more importantly, for differentiating between regional subsidence and site-specific displacements.
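
    The separation of long-term subsidence from seasonal thermal dilation can be illustrated by fitting a linear trend plus an annual sinusoid to a displacement time series. The sketch below uses a synthetic series, not the Oxnard PSI data; the model form is a common simplification and an assumption here.

```python
import numpy as np

# Synthetic PSI displacement series (mm) sampled monthly over four years:
# a linear subsidence trend plus an annual sinusoid standing in for thermal
# dilation of the bridge deck, plus noise. Not the Oxnard data.
t = np.arange(0.0, 4.0, 1.0 / 12.0)                  # time in years
rng = np.random.default_rng(3)
disp = -6.0 * t + 2.5 * np.sin(2 * np.pi * t) + rng.normal(0.0, 0.5, t.size)

# Least-squares design matrix: constant, trend, annual sine and cosine terms.
A = np.column_stack([np.ones_like(t), t,
                     np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
coef, *_ = np.linalg.lstsq(A, disp, rcond=None)

trend_rate = coef[1]                       # mm/yr, proxy for regional subsidence
seasonal_amp = np.hypot(coef[2], coef[3])  # mm, proxy for thermal dilation
print(f"trend = {trend_rate:.1f} mm/yr, seasonal amplitude = {seasonal_amp:.1f} mm")
```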